created_date: 2024-11-29
modified_date: 2024-11-29

GPTQ

[Page scans 01–16 of the Chinese translation of Frantar et al., 2023, "GPTQ: Accurate Post-Training Quantization for Generative Pre-trained Transformers" (webp images).]
